Search results for "Parameter optimization"
Showing 7 of 7 documents
Ultrasonic Welding of PBT-GF30 (70% Polybutylene Terephthalate + 30% Fiber Glass) and Expanded Polytetrafluoroethylene (e-PTFE)
2021
The ultrasonic welding of polymeric materials is one of the methods often used in practice. However, each pair of materials subjected to ultrasonic welding is characterized by different values of the technological parameters. Therefore, the main objective of the research presented in this paper is to optimize the parameters for the ultrasonic welding of two materials, namely PBT-GF30 (70% polybutylene terephthalate + 30% fiber glass) and expanded polytetrafluoroethylene (e-PTFE). To this end, the research was carried out on a plate-type part made of PBT-GF30, which had a thickness of 2.1 mm, and a membrane-type part made of e-PTFE, with a thickness of 0.3 mm. The condition imposed o…
Adjusted bat algorithm for tuning of support vector machine parameters
2016
Support vector machines are a powerful and often-used technique of supervised learning applied to classification. The quality of the constructed classifier can be improved by appropriate selection of the learning parameters. These parameters are often tuned using grid search with a relatively large step. This optimization process can be done more efficiently and more precisely using stochastic search metaheuristics. In this paper we propose an adjusted bat algorithm for support vector machine parameter optimization and show that, compared to grid search, it leads to a better classifier. We tested our approach on a standard set of benchmark data sets from the UCI machine learning repositor…
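The contrast the abstract draws between coarse grid search and stochastic search can be made concrete with a minimal sketch. The snippet below tunes the SVM parameters C and gamma by plain random search over log-uniform ranges; it stands in for the adjusted bat algorithm, whose update rules the abstract does not give, and the dataset and search ranges are illustrative assumptions.

```python
# Minimal sketch: stochastic search over SVM hyperparameters as a
# stand-in for the paper's adjusted bat algorithm (not its actual
# update rules). Dataset and search ranges are illustrative.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

best_score, best_params = -np.inf, None
for _ in range(50):  # 50 random candidates instead of a fixed grid
    C = 10 ** rng.uniform(-2, 3)       # log-uniform sampling suits
    gamma = 10 ** rng.uniform(-4, 1)   # scale-type parameters
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
    if score > best_score:
        best_score, best_params = score, (C, gamma)

print(f"best CV accuracy {best_score:.3f} at C={best_params[0]:.3g}, "
      f"gamma={best_params[1]:.3g}")
```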
A heuristic, iterative algorithm for change-point detection in abrupt change models
2017
Change-point detection in abrupt change models is a very challenging research topic in many fields of both methodological and applied Statistics. Due to strong irregularities, discontinuity and non-smoothness, likelihood-based procedures are awkward; for instance, usual optimization methods do not work, and grid search algorithms represent the most used approach for estimation. In this paper a heuristic, iterative algorithm for approximate maximum likelihood estimation is introduced for change-point detection in piecewise constant regression models. The algorithm is based on iterative fitting of simple linear models, and appears to extend easily to more general frameworks, such as models i…
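For reference, the grid-search baseline the abstract mentions reduces, for a single change-point in a piecewise constant model, to scanning every admissible split and keeping the one with the smallest residual sum of squares (the maximum likelihood split under Gaussian noise). A minimal sketch on synthetic data, not the paper's iterative algorithm:

```python
# Grid search for one change-point in a mean-shift model: pick the
# split minimizing the total within-segment sum of squares. Synthetic
# data; illustrates the baseline, not the paper's heuristic algorithm.
import numpy as np

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 1.0, 60),   # segment 1: mean 0
                    rng.normal(2.5, 1.0, 40)])  # segment 2: mean 2.5
n = len(y)

def sse(seg):
    # residual sum of squares around the segment mean
    return np.sum((seg - seg.mean()) ** 2)

# evaluate every admissible split point 1..n-1 on the grid
costs = [sse(y[:k]) + sse(y[k:]) for k in range(1, n)]
tau = int(np.argmin(costs)) + 1
print(f"estimated change-point index: {tau} (true: 60)")
```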
Influence of Welding Time on Tensile-Shear Strength of Linear Friction Welded Birch (Betula pendula L.) Wood
2015
Linear friction welding of wood is a bonding process during which a stiff bond line is formed by the softening and rehardening of wood components, yielding a composite material composed mainly of wood fibres embedded in a modified lignin matrix. Unfortunately, the bonds tend to spontaneously delaminate or lose their strength when exposed to moist conditions. Several approaches have previously been applied to overcome this problem, but so far a suitable solution has not been found. This paper presents results of applying post-welding thermal modification to reduce the moisture sensitivity of welded wood. The experiments included welding of birch wood, thermal modification under sup…
An LP-based hyperparameter optimization model for language modeling
2018
In order to find hyperparameters for a machine learning model, algorithms such as grid search or random search are used over the space of possible values of the model's hyperparameters. These search algorithms select the solution that minimizes a specific cost function. In language models, perplexity is one of the most popular cost functions. In this study, we propose a fractional nonlinear programming model that finds the optimal perplexity value. The special structure of the model allows us to approximate it by a linear programming model that can be solved using the well-known simplex algorithm. To the best of our knowledge, this is the first attempt to use optimization techniques to find per…
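As a point of reference for the cost function involved, perplexity is the exponential of the average negative log-likelihood on held-out text, and a search algorithm simply picks the hyperparameter value that minimizes it. A minimal sketch using a toy additively smoothed unigram model (an illustrative assumption, not the paper's LP formulation):

```python
# Perplexity as the objective of a hyperparameter grid search, here
# over the additive-smoothing constant alpha of a toy unigram model.
# Corpus and model are illustrative, not from the paper.
import math
from collections import Counter

train = "the cat sat on the mat the dog sat".split()
held_out = "the cat sat on the dog".split()
vocab = set(train) | set(held_out)
counts = Counter(train)

def perplexity(alpha):
    # unigram probability with additive smoothing constant alpha
    def prob(w):
        return (counts[w] + alpha) / (len(train) + alpha * len(vocab))
    nll = -sum(math.log(prob(w)) for w in held_out) / len(held_out)
    return math.exp(nll)  # perplexity = exp(mean negative log-likelihood)

grid = [0.01, 0.1, 0.5, 1.0, 2.0]
best = min(grid, key=perplexity)
print({a: round(perplexity(a), 3) for a in grid})
print(f"alpha minimizing held-out perplexity: {best}")
```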
Implicit differentiation for fast hyperparameter selection in non-smooth convex learning
2022
Finding the optimal hyperparameters of a model can be cast as a bilevel optimization problem, typically solved using zero-order techniques. In this work we study first-order methods when the inner optimization problem is convex but non-smooth. We show that the forward-mode differentiation of proximal gradient descent and proximal coordinate descent yields sequences of Jacobians converging toward the exact Jacobian. Using implicit differentiation, we show it is possible to leverage the non-smoothness of the inner problem to speed up the computation. Finally, we provide a bound on the error made on the hypergradient when the inner optimization problem is solved approxim…
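The inner problems in question are non-smooth convex programs such as the Lasso, typically solved by proximal gradient descent (ISTA), whose iterates the paper differentiates with respect to the regularization strength. A minimal sketch of that inner solver alone, on illustrative synthetic data; the paper's differentiation machinery is not reproduced here:

```python
# Proximal gradient descent (ISTA) for the Lasso, the prototypical
# non-smooth inner problem of the bilevel formulation. Data and the
# regularization strength lam are illustrative assumptions.
import numpy as np

def soft_threshold(z, t):
    # proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # minimize 0.5 * ||X beta - y||^2 + lam * ||beta||_1
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 10))
y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(50)
print(lasso_ista(X, y, lam=5.0).round(3))  # sparse estimate of the coefficients
```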
Online Hyperparameter Search Interleaved with Proximal Parameter Updates
2021
There is a clear need for efficient hyperparameter optimization (HO) algorithms for statistical learning, since commonly applied search methods (such as grid search with N-fold cross-validation) are inefficient and/or approximate. Existing gradient-based HO algorithms that rely on the smoothness of the cost function cannot be applied to problems such as Lasso regression. In this contribution, we develop an HO method that relies on the structure of proximal gradient methods and does not require a smooth cost function. Such a method is applied to leave-one-out (LOO)-validated Lasso and Group Lasso, and an online variant is proposed. Numerical experiments corroborate the convergence …
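The baseline the abstract calls inefficient is easy to state: one full model fit per grid point per validation fold. A minimal sketch of LOO-validated grid search for the Lasso (grid and data are illustrative assumptions), the kind of procedure an interleaved online method would be compared against:

```python
# Grid search over the Lasso regularization strength with
# leave-one-out cross-validation: the inefficient baseline.
# Grid and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)
X = rng.standard_normal((30, 8))
y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(30)

grid = np.logspace(-3, 0, 10)  # candidate regularization strengths
scores = [cross_val_score(Lasso(alpha=a), X, y, cv=LeaveOneOut(),
                          scoring="neg_mean_squared_error").mean()
          for a in grid]
best = grid[int(np.argmax(scores))]
# cost: 10 alphas x 30 left-out samples = 300 separate Lasso fits
print(f"LOO-selected alpha: {best:.4f}")
```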